Two-level optimization approach with accelerated proximal gradient for objective measures in sparse speech reconstruction
Authors
Abstract
Compressive speech enhancement exploits the sparseness of speech and the non-sparseness of noise in the time-frequency representation to perform enhancement. However, reconstructing the sparsest output does not necessarily translate into a good enhanced signal, as distortion may be introduced. This paper proposes a two-level optimization approach that incorporates objective quality measures into compressive speech enhancement. The proposed method combines the accelerated proximal gradient method with a global one-dimensional optimization method to solve the sparse reconstruction problem. By incorporating the quality measure into the reconstruction process, the reconstructed signal is not only sparse but also maintains the highest quality score possible. In other words, the reconstruction process is now guided by the objective quality measure. Experimental results consistently show improvement in the objective quality measures in different noisy environments compared with the non-optimized method. Additionally, the proposed approach yields a higher convergence rate with lower computational complexity than existing methods.
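The two-level structure described in the abstract can be sketched as follows: an inner FISTA-style accelerated proximal gradient loop solves an ℓ1-regularized reconstruction for a fixed regularization weight, while an outer one-dimensional (golden-section) search selects the weight that maximizes an objective quality score. This is a minimal illustration, not the authors' implementation: the `quality_score` stand-in (negative residual energy here, where the paper would use a perceptual speech-quality measure), the golden-section search, and all parameter choices are assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1 (elementwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def apg_l1(A, y, lam, n_iter=200):
    """FISTA-style accelerated proximal gradient for
    min_x 0.5 * ||A x - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z, t = x.copy(), 1.0
    for _ in range(n_iter):
        x_new = soft_threshold(z - A.T @ (A @ z - y) / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

def quality_score(x, A, y):
    """Hypothetical stand-in for an objective quality measure;
    here simply negative residual energy."""
    return -np.linalg.norm(A @ x - y) ** 2

def two_level_reconstruct(A, y, lam_lo=1e-4, lam_hi=1.0, n_outer=20):
    """Outer golden-section search over the regularization weight,
    scoring each inner APG solution with the quality measure."""
    phi = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = np.log10(lam_lo), np.log10(lam_hi)   # search on a log scale
    for _ in range(n_outer):
        c, d = b - phi * (b - a), a + phi * (b - a)
        fc = quality_score(apg_l1(A, y, 10.0 ** c), A, y)
        fd = quality_score(apg_l1(A, y, 10.0 ** d), A, y)
        if fc > fd:
            b = d        # maximum lies in [a, d]
        else:
            a = c        # maximum lies in [c, b]
    lam_best = 10.0 ** ((a + b) / 2.0)
    return apg_l1(A, y, lam_best), lam_best
```

Searching the weight on a log scale keeps the outer problem one-dimensional and cheap relative to the inner reconstructions, which is what makes the two-level scheme practical.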
Similar references
Proximal gradient algorithm for group sparse optimization
In this paper, we propose a proximal gradient algorithm for solving a general nonconvex and nonsmooth optimization model of minimizing the summation of a C^{1,1} function and a grouped separable lsc function. This model includes the group sparse optimization via ℓ_{p,q} regularization as a special case. Our algorithmic scheme presents a unified framework for several well-known iterative thresholding ...
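As an illustration of the grouped model, the sketch below implements plain proximal gradient for the convex ℓ_{2,1} special case (p = 2, q = 1), whose proximal operator is group soft-thresholding; the group partition and iteration count are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def prox_group_l21(x, t, groups):
    """Proximal operator of t * sum_g ||x_g||_2 (group soft-thresholding).
    `groups` is a list of index arrays partitioning the coordinates."""
    out = x.copy()
    for g in groups:
        norm_g = np.linalg.norm(x[g])
        out[g] = 0.0 if norm_g <= t else (1.0 - t / norm_g) * x[g]
    return out

def prox_grad_group(A, y, lam, groups, n_iter=300):
    """Proximal gradient iteration for
    min_x 0.5 * ||A x - y||^2 + lam * sum_g ||x_g||_2."""
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = prox_group_l21(x - grad / L, lam / L, groups)
    return x
```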
An Adaptive Accelerated Proximal Gradient Method and its Homotopy Continuation for Sparse Optimization
We first propose an adaptive accelerated proximal gradient (APG) method for minimizing strongly convex composite functions with unknown convexity parameters. This method incorporates a restarting scheme to automatically estimate the strong convexity parameter and achieves a nearly optimal iteration complexity. Then we consider the ℓ1-regularized least-squares (ℓ1-LS) problem in the high-dimensio...
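A minimal sketch of the homotopy-continuation idea for ℓ1-LS: start from the weight at which the zero solution is optimal, shrink it geometrically toward the target, and warm-start each stage's solver from the previous solution. The geometric schedule, the plain ISTA inner solver, and the stage iteration count are assumptions for illustration.

```python
import numpy as np

def ista_l1(A, y, lam, x0, n_iter=100):
    """Plain proximal gradient (ISTA) for 0.5*||Ax - y||^2 + lam*||x||_1,
    warm-started at x0."""
    L = np.linalg.norm(A, 2) ** 2
    x = x0.copy()
    for _ in range(n_iter):
        v = x - A.T @ (A @ x - y) / L
        x = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)
    return x

def homotopy_l1_ls(A, y, lam_target, eta=0.5):
    """Homotopy continuation: start from lam_max (above which x = 0 is
    optimal), shrink lam geometrically toward lam_target, warm-starting
    each stage from the previous stage's solution."""
    lam = np.linalg.norm(A.T @ y, np.inf)
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(eta * lam, lam_target)
        x = ista_l1(A, y, lam, x)      # warm start from previous stage
    return x
```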
An Adaptive Accelerated Proximal Gradient Method and its Homotopy Continuation for Sparse Optimization
We consider optimization problems with an objective function that is the sum of two convex terms: one is smooth and given by a black-box oracle, and the other is general but with a simple, known structure. We first present an accelerated proximal gradient (APG) method for problems where the smooth part of the objective function is also strongly convex. This method incorporates an efficient line...
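The paper's adaptive scheme estimates the strong-convexity parameter itself; as a simpler stand-in for the restart idea, the sketch below uses the well-known function-value restart heuristic (reset the momentum whenever the composite objective increases). It illustrates restarting in general, not this paper's specific method.

```python
import numpy as np

def apg_restart(grad_f, prox_g, f_plus_g, x0, L, n_iter=500):
    """Accelerated proximal gradient with a function-value restart:
    momentum is reset whenever the composite objective increases.
    grad_f: gradient of the smooth part; prox_g(v, t): prox of t*g;
    f_plus_g: full objective, used only for the restart test."""
    x, z, t = x0.copy(), x0.copy(), 1.0
    f_prev = f_plus_g(x)
    for _ in range(n_iter):
        x_new = prox_g(z - grad_f(z) / L, 1.0 / L)
        f_new = f_plus_g(x_new)
        if f_new > f_prev:                 # restart: drop the momentum
            z, t = x.copy(), 1.0
            continue
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t, f_prev = x_new, t_new, f_new
    return x
```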
Accelerated Proximal Gradient Methods for Nonconvex Programming
Nonconvex and nonsmooth problems have recently received considerable attention in signal/image processing, statistics and machine learning. However, solving the nonconvex and nonsmooth optimization problems remains a big challenge. Accelerated proximal gradient (APG) is an excellent method for convex programming. However, it is still unknown whether the usual APG can ensure the convergence to a...
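One way to guarantee convergence for nonconvex penalties is to make APG monotone: compute both the accelerated candidate and a plain proximal-gradient safeguard, and keep whichever has the lower objective. The sketch below applies this idea to an ℓ0 (hard-thresholding) penalty; it is a simplified variant for illustration, not the exact algorithm of the cited paper.

```python
import numpy as np

def hard_threshold(v, t):
    """Proximal operator of t * ||x||_0: keep entries with |v_i| > sqrt(2t)."""
    out = v.copy()
    out[np.abs(v) <= np.sqrt(2.0 * t)] = 0.0
    return out

def monotone_apg_l0(A, y, lam, n_iter=300):
    """Simplified monotone APG for the nonconvex problem
    min_x 0.5 * ||A x - y||^2 + lam * ||x||_0.
    Each iteration takes the better of the accelerated candidate and a
    plain proximal-gradient candidate, so the objective never increases."""
    L = np.linalg.norm(A, 2) ** 2

    def F(x):  # composite objective, used for the monotone test
        return 0.5 * np.linalg.norm(A @ x - y) ** 2 + lam * np.count_nonzero(x)

    def step(v):  # one proximal-gradient step from v
        return hard_threshold(v - A.T @ (A @ v - y) / L, lam / L)

    x_prev = x = np.zeros(A.shape[1])
    t_prev, t = 1.0, 1.0
    for _ in range(n_iter):
        w = x + (t_prev / t) * (x - x_prev)        # extrapolated point
        z = step(w)                                # accelerated candidate
        v = step(x)                                # safeguard candidate
        x_prev, x = x, z if F(z) <= F(v) else v    # monotone choice
        t_prev, t = t, (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
    return x
```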
Journal
Journal title: Journal of Industrial and Management Optimization
Year: 2022
ISSN: 1547-5816, 1553-166X
DOI: https://doi.org/10.3934/jimo.2021131